35 research outputs found

    Development of an intelligent object for grasp and manipulation research

    Kõiva R, Haschke R, Ritter H. Development of an intelligent object for grasp and manipulation research. Presented at ICAR 2011, Tallinn, Estonia. In this paper we introduce a novel device, called iObject, which is equipped with tactile and motion tracking sensors that allow for the evaluation of human and robot grasping and manipulation actions. Contact location and contact force, object acceleration in space (6D) and orientation relative to the earth (3D magnetometer) are measured and transmitted wirelessly over a Bluetooth connection. By allowing human-human, human-robot and robot-robot comparisons to be made, iObject is a versatile tool for studying manual interaction. To demonstrate the efficiency and flexibility of iObject for the study of bimanual interactions, we report on a physiological experiment and evaluate the main parameters of the considered dual-handed manipulation task.

    Towards observable haptics: Novel sensors for capturing tactile interaction patterns

    Kõiva R. Towards observable haptics: Novel sensors for capturing tactile interaction patterns. Bielefeld: Bielefeld University; 2014. Touch is one of the primary senses humans use when performing coordinated interaction, but the lack of a sense of touch in the majority of contemporary interactive technical systems, such as robots that operate in non-deterministic environments, results in interactions that can at best be described as clumsy. Observing human haptics and extracting the salient information from the gathered data is not only relevant if we are to understand the underlying cognitive processes involved, but should also provide us with significant clues for designing future intelligent interactive systems. Such systems could one day help to take the burden of tedious tasks off our hands, in a similar fashion to how industrial robots revolutionized manufacturing. The aim of the work in this thesis was to provide significant advancements in tactile sensing technology, and thus move us a step closer to realizing this goal. The contributions can be broken into two major parts. The first part investigates capturing interaction patterns in humans, with the goals of better understanding manual intelligence and improving the lives of hand amputees, while the second part focuses on augmenting technical systems with a sense of touch. tacTiles, a wireless tactile-sensitive surface element attached to a deformable textile, was developed to capture human full-body interactions with the large surfaces we come into contact with in our daily lives, such as floors, chairs, sofas or other furniture. The Tactile Dataglove, iObject and the Tactile Pen were developed specifically to observe human manual intelligence. Whereas iObject allows motion sensing and a higher-resolution tactile signal to be captured than the Tactile Dataglove (220 tactile cells in the first iObject prototype versus 54 cells in the glove), the wearable glove makes haptic interactions with arbitrary objects observable. The Tactile Pen was designed to measure grip force during handwriting in order to better facilitate therapeutic treatment assessments. These sensors have already been used extensively by various research groups, including our own, to gain a better understanding of human manual intelligence. The Finger-Force-Linear-Sensor and the Tactile Bracelet are two novel sensors that were developed to facilitate more natural control of dexterous multi-Degree-of-Freedom (DOF) hand prostheses. The Finger-Force-Linear-Sensor is a highly accurate bidirectional single-finger force ground-truth measurement device, designed to enable the testing and development of algorithms that map muscle activations to single-finger forces. The Tactile Bracelet was designed to provide a more robust and intuitive means of control for multi-DOF hand prostheses by measuring the bulging of the remnant muscles of lower-arm amputees. It is currently in development and will eventually cover the complete forearm circumference with high-spatial-resolution tactile-sensitive surfaces. An experiment involving a large number of lower-arm amputees has already been planned. The Modular flat tactile sensor system, the Fabric-based touch-sensitive artificial skin and the 3D-shaped tactile sensor were developed to cover the surfaces of technical systems and add touch-sensing capabilities to them. The rapid augmentation of systems with a sense of touch was the main goal of the modular flat tactile sensor system. The developed sensor modules can be used alone or in an array to form larger tactile-sensitive surfaces, such as tactile-sensitive tabletops. As many robots have curved surfaces, using flat rigid modules severely limits the areas that can be covered with tactile sensors. The Fabric-based tactile sensor, originally developed to form a tactile dataglove for human hands, can with minor modifications also function as an artificial skin for technical systems. Finally, the 3D-shaped tactile sensor based on Laser-Direct-Structuring technology is a novel tactile sensor that has a true 3D shape and provides high sensitivity and high spatial resolution. These sensors take us further along the path towards creating general-purpose technical systems that in time can be of great help to us in our daily lives. The desired tactile sensor characteristics differ significantly according to which haptic interaction patterns we wish to measure. Large tactile sensor arrays that are used to capture full-body haptic interactions with floors and upholstered furniture, or that are designed to cover large areas of technical system surfaces, need to be scalable, have low power consumption and should ideally have a low material cost. Two examples of such sensors are tacTiles and the Fabric-based sensor for curved surfaces. At the other end of the tactile sensor development spectrum, if we want to observe manual interactions, high spatial and temporal resolution are crucial to enable the measurement of fine grasping and manipulation actions. Our fingertips contain the highest-density area of mechanoreceptors, the organs that sense mechanical pressure and distortions. Thus, to construct biologically inspired anthropomorphic robotic hands, the artificial tactile sensors for the fingertips need to be similarly high-fidelity, with surfaces that curve with small bending radii in two dimensions, high spatial density and, at the same time, high sensitivity. With the fingertip tactile sensor designed to fit the fingers of the Shadow Robot Hand, I show in the 3D-shaped high-spatial-resolution tactile sensor section of my thesis that such sensors can indeed be constructed. With my work I have made a significant contribution towards making haptics more observable. I achieved this by developing a number of novel tactile sensors that are usable, give a deeper insight into human haptic interactions, have great potential to help amputees, and make technical systems, such as robots, more capable.
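
    The thesis abstract above distinguishes sensors mainly by spatial resolution and by the kind of contact they capture. Purely as an illustration (not code from the thesis), the sketch below shows how a single frame from a generic tactile array might be reduced to the two quantities such sensors typically report, total contact force and the contact centroid; the array shape and the calibration factor are assumptions.

```python
import numpy as np

def summarize_tactile_frame(frame, cell_to_newton=1.0):
    """Reduce one tactile-array frame to total force and contact centroid.

    frame          : 2D array of raw cell readings (rows x cols), assumed
                     already offset-corrected.
    cell_to_newton : assumed linear calibration factor per cell.
    """
    forces = frame * cell_to_newton
    total_force = forces.sum()
    if total_force <= 0.0:
        return 0.0, None  # no contact detected
    rows, cols = np.indices(frame.shape)
    # Centre of pressure: force-weighted mean cell coordinate.
    centroid = (
        (rows * forces).sum() / total_force,
        (cols * forces).sum() / total_force,
    )
    return total_force, centroid

# Example with a synthetic 16x16 frame containing a single pressed region.
frame = np.zeros((16, 16))
frame[4:7, 9:12] = 5.0
print(summarize_tactile_frame(frame))
```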

    Online natural myocontrol of combined hand and wrist actions using tactile myography and the biomechanics of grasping

    Connan M, Kõiva R, Castellini C. Online natural myocontrol of combined hand and wrist actions using tactile myography and the biomechanics of grasping. Frontiers in Neurorobotics. 2020;14:11. Objective: Despite numerous recent advances in the field of rehabilitation robotics, simultaneous and proportional control of hand and/or wrist prostheses is still unsolved. In this work we concentrate on myocontrol of combined actions, for instance power grasping while rotating the wrist, using only training data gathered from single actions. This is highly desirable, since gathering data for all possible combined actions would be unfeasibly long and demanding for the amputee. Approach: We first investigated physiologically feasible limits for muscle activation during combined actions. Using these limits, we involved 12 intact participants and one amputee in a Target Achievement Control test, showing that tactile myography, i.e., high-density force myography, solves the problem of combined actions to a remarkable extent using simple linear regression. Since real-time usage of many sensors can be computationally demanding, we compare this approach with one using a reduced feature set, obtained from a fast, spatial first-order approximation of the sensor values. Main results: Using the training data of single actions only, i.e., power grasp or wrist movements, subjects achieved an average success rate of 70.0% in the target achievement test using ridge regression. When combining wrist actions, e.g., pronating and flexing the wrist simultaneously, similar results were obtained, with an average of 68.1%. If a power grasp is added to the pool of actions, combined actions are much more difficult to achieve (36.1%). Significance: To the best of our knowledge, this is the first time the effectiveness of tactile myography on single and combined actions has been evaluated in a target achievement test. The present study includes control of 3 DoFs instead of the two generally used in the literature. Additionally, we define a set of physiologically plausible muscle activation limits valid for most experiments of this kind.
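
    The approach described above is essentially a multi-output linear mapping from tactile-bracelet frames to proportional DoF commands, trained on single actions only. The sketch below is a minimal illustration of that idea using ridge regression; the frame size, number of DoFs and the synthetic stand-in data are assumptions, not values or data from the paper.

```python
import numpy as np
from sklearn.linear_model import Ridge

# Hypothetical shapes: each training sample is one flattened tactile-bracelet
# frame; each target is a vector of DoF activations (e.g. grasp, wrist
# pronation/supination, wrist flexion/extension) in [-1, 1].
rng = np.random.default_rng(0)
X_single = rng.normal(size=(600, 320))        # frames from single actions only
y_single = rng.uniform(-1, 1, size=(600, 3))  # stand-in DoF labels

model = Ridge(alpha=1.0)
model.fit(X_single, y_single)                 # trained on single actions

# At run time the same linear map is applied to frames recorded during
# combined actions (e.g. power grasp while rotating the wrist).
x_combined = rng.normal(size=(1, 320))
dof_estimate = model.predict(x_combined)
print(dof_estimate)                           # proportional command per DoF
```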

    Tactile myography: an off-line assessment on intact subjects and one upper-limb disabled

    Castellini C, Kõiva R, Pasluosta C, Viegas C, Eskofier BM. Tactile myography: an off-line assessment on intact subjects and one upper-limb disabled. Technologies / SI: Assistive Robotics. 2018;6(2):38. Human-machine interfaces to control prosthetic devices still suffer from scarce dexterity and low reliability; for this reason, the community of assistive robotics is exploring novel solutions to the problem of myocontrol. In this work, we present experimental results pointing in the direction that one such method, namely Tactile Myography (TMG), can improve the situation. In particular, we use a shape-conformable high-resolution tactile bracelet wrapped around the forearm/residual limb to discriminate several wrist and finger activations performed by able-bodied subjects and a trans-radial amputee. Several combinations of features/classifiers were tested to discriminate among the activations. The balanced accuracy obtained by the best classifier/feature combination was on average 89.15% (able-bodied subjects) and 88.72% (amputated subject); when considering wrist activations only, the results were on average 98.44% for the able-bodied subjects and 98.72% for the amputee. The results obtained from the amputee were comparable to those obtained by the able-bodied subjects. This suggests that TMG is a viable technique for myoprosthetic control, either as a replacement of or as a companion to traditional surface electromyography.
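
    Since the paper compares feature/classifier combinations by balanced accuracy, the sketch below shows one way such an offline comparison could be set up; the linear-SVM pipeline, feature dimensionality and synthetic labels are illustrative assumptions rather than the combinations actually tested.

```python
import numpy as np
from sklearn.metrics import balanced_accuracy_score
from sklearn.model_selection import cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in data: X holds one feature vector per tactile-bracelet frame,
# y holds the activation label (wrist/finger action) for that frame.
rng = np.random.default_rng(1)
X = rng.normal(size=(400, 128))
y = rng.integers(0, 6, size=400)

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
y_pred = cross_val_predict(clf, X, y, cv=5)

# Balanced accuracy averages per-class recall, so rare activations are not
# drowned out by frequent ones.
print(balanced_accuracy_score(y, y_pred))
```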

    Fingernail with static and dynamic force sensing

    Kõiva R, Schwank T, Haschke R, Ritter H. Fingernail with static and dynamic force sensing. Presented at the Humanoids 2016 Workshop "Tactile sensing for manipulation: new progress and challenges", Cancun, Mexico. We report on our development of sensorized fingernails for mechatronic hands. Our proposed design can capture static and dynamic interaction forces with the nail and provide basic information about the direction of the main force vector. Over the course of several iterations, we have developed a very compact working prototype that fits, together with our previously developed multi-cell MID-based tactile fingertip sensor, into the cavity of a finger of a Shadow Robot Hand, a robotic hand roughly the size of an average male hand. High sensitivity combined with robustness for daily use are the key features of our proposed design.

    Mechatronic fingernail with static and dynamic force sensing

    Kõiva R, Schwank T, Walck G, Haschke R, Ritter H. Mechatronic fingernail with static and dynamic force sensing. In: International Conference on Intelligent Robots and Systems (IROS). 2018. Our fingernails help us accomplish a variety of manual tasks, but surprisingly only a few robotic hands are equipped with nails. In this paper, we present a sensorized fingernail for mechatronic hands that can capture static and dynamic interaction forces with the nail. Over the course of several iterations, we have developed a very compact working prototype that fits, together with our previously developed multi-cell tactile fingertip sensor, into the cavity of the distal phalanx of a human-sized robotic hand. We present the construction details, list the key performance characteristics and demonstrate an example application: finding the end of an adhesive tape roll using the signals captured by the sensors integrated in the nail. We conclude with a discussion of improvement ideas for future versions.
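
    The tape-roll example above relies on the dynamic force channel of the nail. As a hedged illustration only (the paper does not publish its detection code), the sketch below separates transient events from slow static loading with a simple high-pass filter and threshold; the filter order, cutoff and threshold are made-up parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def detect_nail_events(dynamic_signal, fs, cutoff_hz=20.0, threshold=0.5):
    """Flag transient contacts in the nail's dynamic force channel.

    A high-pass filter removes slow static loading; samples whose filtered
    magnitude exceeds the threshold are treated as candidate events, e.g.
    the edge of a tape roll passing under the nail.
    """
    b, a = butter(2, cutoff_hz / (fs / 2.0), btype="highpass")
    transient = filtfilt(b, a, dynamic_signal)
    return np.flatnonzero(np.abs(transient) > threshold)

# Synthetic example: slow ramp (static load) plus one sharp spike (an edge).
fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
signal = 0.3 * t
signal[500:505] += 2.0
print(detect_nail_events(signal, fs)[:5])
```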

    Feel-good robotics: requirements on touch for embodiment in assistive robotics

    The feeling of embodiment, i.e., experiencing the body as belonging to oneself and being able to integrate objects into one’s bodily self-representation, is a key aspect of human self-consciousness and has been shown to importantly shape human cognition. An extension of such feelings toward robots has been argued to be crucial for assistive technologies aiming at restoring, extending, or simulating sensorimotor functions. Empirical and theoretical work illustrates the importance of sensory feedback for the feeling of embodiment and also immersion; here we focus on the perceptual level of touch and the role of tactile feedback in various assistive robotic devices. We critically review how different facets of tactile perception in humans, i.e., affective, social, and self-touch, might influence embodiment. This is particularly important as current assistive robotic devices, such as prostheses, orthoses, exoskeletons, and devices for teleoperation, often limit touch to low-density and spatially constrained haptic feedback, i.e., the mere touch sensation linked to an action. Here, we analyze, discuss, and propose how and to what degree tactile feedback might increase the embodiment of certain robotic devices, e.g., prostheses, and the feeling of immersion in human-robot interaction, e.g., in teleoperation. Based on recent findings from cognitive psychology on interactive processes between touch and embodiment, we discuss technical solutions for specific applications, which might be used to enhance embodiment and facilitate the study of how embodiment might alter human-robot interactions. We postulate that high-density and large-surface sensing and stimulation are required to foster embodiment of such assistive devices.

    Using a high spatial resolution tactile sensor for intention detection

    Castellini C, Kõiva R. Using a high spatial resolution tactile sensor for intention detection. Presented at the 13th International Conference on Rehabilitation Robotics (ICORR 2013), Seattle, Washington, USA. Intention detection is the interpretation of biological signals with the aim of automatically, reliably and naturally understanding what a human subject desires to do. Although intention detection is not restricted to disabled people, such methods can be crucial in improving a patient's life, e.g., aiding control of a robotic wheelchair or of a self-powered prosthesis. Traditionally, intention detection is done using, e.g., gaze tracking, surface electromyography and electroencephalography. In this paper we present exciting initial results of an experiment aimed at intention detection using a high-spatial-resolution, high-dynamic-range tactile sensor. The tactile image of the ventral side of the forearm of 9 able-bodied participants was recorded during a variable-force task stimulated at the fingertip. Both the forces at the fingertip and at the forearm were synchronously recorded. We show that a standard dimensionality reduction technique (Principal Component Analysis) plus a Support Vector Machine attain almost perfect detection accuracy of the direction and the intensity of the intended force. This paves the way for high spatial resolution tactile sensors to be used as a means for intention detection.
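
    The pipeline named in the abstract, PCA for dimensionality reduction followed by an SVM, can be sketched as follows; the tactile image size, number of force classes and the synthetic data are assumptions for illustration, not the experimental setup of the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Stand-in data: each row is one flattened tactile image of the ventral
# forearm, each label encodes the intended fingertip force direction.
rng = np.random.default_rng(2)
X = rng.normal(size=(900, 28 * 50))   # assumed sensor geometry, not the real one
y = rng.integers(0, 4, size=900)      # e.g. four force directions

# PCA compresses the tactile image before a standard SVM classifier,
# mirroring the dimensionality-reduction-plus-SVM pipeline in the abstract.
pipeline = make_pipeline(PCA(n_components=20), SVC(kernel="rbf", C=1.0))
pipeline.fit(X[:700], y[:700])
print(pipeline.score(X[700:], y[700:]))
```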

    Intention Gathering from Muscle Residual Activity for the Severely Disabled

    Castellini C, Kõiva R. Intention Gathering from Muscle Residual Activity for the Severely Disabled. Presented at the IROS 2012 Workshop on Progress, Challenges and Future Perspectives in Navigation and Manipulation Assistance for Robotic Wheelchairs, Algarve, Portugal. In this position paper we give an overview of some possible ways of capturing patients' intentions when little muscle residual activity is present. In particular, surface electromyography, ultrasound imaging, force detection and tactile muscle movement detection are described, and their potential relevance for the severely disabled / wheelchair community is discussed. Whereas electromyography is a well-known technique, especially in prosthesis control, ultrasound imaging and tactile detection constitute an original contribution to this research community's potential applications.

    Sensors for capturing tactile interaction patterns

    Kõiva R, Schürmann C. Sensors for capturing tactile interaction patterns. Presented at the 7th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2012), Boston, USA. This work discusses three tactile sensor systems developed by the authors to capture interaction patterns of humans and anthropomorphic robots. The first, tacTiles, is a flexible blanket with embedded force sensors, facilitating rapid augmentation of the environment with a sense of touch, and is targeted towards cognitive rooms and ambient intelligence. The second, Myrmex, is a modular 2D high-speed tactile pattern camera with a USB-Video-Class interface, allowing computer vision algorithms to work on tactile data in a plug-and-play fashion. Finally, the intelligent Object (iObject) is a wireless instrumented tool equipped with tactile and motion capture sensors for evaluating human or robotic grasping and manipulation.
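
    Because Myrmex is described as exposing a standard USB-Video-Class interface, a generic video-capture API should in principle be able to read its tactile frames like a webcam stream. The sketch below illustrates that idea; the device index and the interpretation of pixel values as pressure are assumptions.

```python
import cv2

# The tactile pattern "camera" is assumed to enumerate as an ordinary video
# device; index 0 is a placeholder for whichever index the sensor gets.
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    # Treat the grayscale intensity of each cell as a (relative) pressure value.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    print("frame shape:", gray.shape, "peak cell value:", int(gray.max()))
cap.release()
```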